'''Convex minimization''', a subfield of optimization, studies the problem of minimizing convex functions over convex sets. The convexity property can make optimization in some sense "easier" than the general case – for example, any local minimum must be a global minimum.

Given a real vector space ''X'' together with a convex, real-valued function

:<math>f : C \to \mathbb{R}</math>

defined on a convex subset ''C'' of ''X'', the problem is to find any point <math>x^\ast</math> in ''C'' for which the number <math>f(x^\ast)</math> is smallest, i.e., a point <math>x^\ast</math> such that

:<math>f(x^\ast) \le f(x)</math> for all <math>x \in C</math>.

The convexity of ''f'' makes the powerful tools of convex analysis applicable. In finite-dimensional normed spaces, the Hahn–Banach theorem and the existence of subgradients lead to a particularly satisfying theory of necessary and sufficient conditions for optimality, a duality theory generalizing that for linear programming, and effective computational methods.

Convex minimization has applications in a wide range of disciplines, such as automatic control systems, estimation and signal processing, communications and networks, electronic circuit design, data analysis and modeling, statistics (optimal design), and finance. With recent improvements in computing and in optimization theory, convex minimization is nearly as straightforward as linear programming.

Many optimization problems can be reformulated as convex minimization problems. For example, the problem of ''maximizing'' a ''concave'' function ''f'' can be re-formulated equivalently as the problem of ''minimizing'' the function −''f'', which is ''convex''.

==Convex optimization problem==
An ''optimization problem'' (also referred to as a ''mathematical programming problem'' or ''minimization problem'') of finding some <math>x^\ast \in \mathcal{X}</math> such that

:<math>f(x^\ast) = \min \{ f(x) : x \in \mathcal{X} \},</math>

where <math>\mathcal{X} \subset \mathbb{R}^n</math> is the ''feasible set'' and <math>f : \mathbb{R}^n \to \mathbb{R}</math> is the ''objective'', is called ''convex'' if <math>\mathcal{X}</math> is a closed convex set and ''f'' is convex on <math>\mathcal{X}</math>.

Alternatively, an optimization problem of the form

:<math>\begin{align}
&\operatorname{minimize}& &f(x) \\
&\operatorname{subject\ to}& &g_i(x) \le 0, \quad i = 1, \dots, m
\end{align}</math>

is called convex if the functions <math>f, g_1, \ldots, g_m : \mathbb{R}^n \to \mathbb{R}</math> are convex.〔Boyd/Vandenberghe, p. 7〕
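The definitions above can be illustrated with a minimal sketch, not part of the original article: projected gradient descent minimizing the convex function ''f''(''x'') = (''x'' − 3)² over the closed convex feasible set ''C'' = [0, 1]. The function, set, and step size are illustrative choices; because both ''f'' and ''C'' are convex, the point the iteration settles on (the boundary point ''x'' = 1) is the global constrained minimizer.

```python
# Sketch: minimize the convex f(x) = (x - 3)^2 over the convex set C = [0, 1]
# by projected gradient descent. All names and parameters here are
# illustrative, not from the source article.

def f(x):
    return (x - 3.0) ** 2

def grad_f(x):
    # derivative of (x - 3)^2
    return 2.0 * (x - 3.0)

def project(x, lo=0.0, hi=1.0):
    # Euclidean projection onto the interval [lo, hi] (a convex set)
    return max(lo, min(hi, x))

def projected_gradient_descent(x0, step=0.1, iters=200):
    x = project(x0)
    for _ in range(iters):
        # gradient step on f, then projection back onto C
        x = project(x - step * grad_f(x))
    return x

x_star = projected_gradient_descent(0.5)
print(x_star)      # converges to 1.0, the constrained minimizer
print(f(x_star))   # minimum value over C: 4.0
```

Since the unconstrained minimizer ''x'' = 3 lies outside ''C'', the projection clamps each iterate to the feasible set, and convexity guarantees the boundary point found is not merely a local minimum.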